Playing Codenames with Language Graphs and Word Embeddings

Authors

Abstract

Although board games and video games have been studied for decades in artificial intelligence research, challenging word games remain relatively unexplored. Word games are not as constrained as games like chess or poker. Instead, word game strategy is defined by the players’ understanding of the way words relate to each other. The word game Codenames provides a unique opportunity to investigate common sense relationships between words, an important open challenge. We propose an algorithm that can generate clues from the language graph BabelNet or from any of several embedding methods – word2vec, GloVe, fastText, and BERT. We introduce a new scoring function that measures the quality of clues, and we propose a weighting term called DETECT that incorporates dictionary-based word representations and document frequency to improve clue selection. We also develop the BabelNet-Word Selection Framework (BabelNetWSF) to overcome the computational barriers that previously prevented leveraging language graphs for Codenames. Extensive experiments with human evaluators demonstrate that our proposed innovations yield state-of-the-art performance, with up to a 102.8% improvement in precision@2 in some cases. Overall, this work advances the formal study of word games and approaches to common sense language understanding.
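As a rough illustration of embedding-based clue scoring of the kind the abstract describes (a minimal sketch under assumptions, not the paper's scoring function, DETECT weighting, or BabelNet-WSF; the score, penalty term, and function names below are hypothetical), candidate clues can be ranked by their similarity to the team's words and dissimilarity from the words to avoid:

```python
# Illustrative sketch only: rank candidate clues by closeness to the team's
# words and distance from the opposing words in a pre-trained embedding space.
import numpy as np


def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))


def score_clue(clue_vec, team_vecs, avoid_vecs, penalty=1.0):
    """Reward similarity to the team's words, penalize the riskiest avoid word."""
    reward = np.mean([cosine(clue_vec, t) for t in team_vecs])
    risk = max(cosine(clue_vec, a) for a in avoid_vecs) if avoid_vecs else 0.0
    return reward - penalty * risk


def best_clues(embeddings, candidates, team_words, avoid_words, top_k=5):
    """Return the top-k candidate clues under the toy score above.

    `embeddings` is any dict-like word -> vector mapping (e.g. loaded from
    word2vec, GloVe, or fastText files); board words may not be used as clues.
    """
    team_vecs = [embeddings[w] for w in team_words if w in embeddings]
    avoid_vecs = [embeddings[w] for w in avoid_words if w in embeddings]
    board = set(team_words) | set(avoid_words)
    scored = [
        (score_clue(embeddings[c], team_vecs, avoid_vecs), c)
        for c in candidates
        if c in embeddings and c not in board
    ]
    return [c for _, c in sorted(scored, reverse=True)[:top_k]]
```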



Similar resources

Language Models with GloVe Word Embeddings

In this work we present a step-by-step implementation of training a Language Model (LM), using a Recurrent Neural Network (RNN) and pre-trained GloVe word embeddings, introduced by Pennington et al. in [1]. The implementation follows the general idea of training RNNs for LM tasks presented in [2], but uses a Gated Recurrent Unit (GRU) [3] for the memory cell, and not the more commonl...
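The snippet below is a minimal sketch of this kind of setup, assuming PyTorch and a pre-built `glove_weights` tensor of pre-trained GloVe vectors (the tensor and class name are placeholders, not the cited implementation): a GRU over frozen GloVe embeddings that predicts the next token.

```python
# Minimal sketch of a GRU language model with frozen pre-trained GloVe
# embeddings (hypothetical tensor `glove_weights`, shape [vocab_size, emb_dim]).
import torch
import torch.nn as nn


class GRULanguageModel(nn.Module):
    def __init__(self, glove_weights, hidden_size=256):
        super().__init__()
        vocab_size, emb_dim = glove_weights.shape
        # Initialize the embedding layer from GloVe and keep it fixed.
        self.embedding = nn.Embedding.from_pretrained(glove_weights, freeze=True)
        self.gru = nn.GRU(emb_dim, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, vocab_size)

    def forward(self, token_ids, hidden=None):
        # token_ids: [batch, seq_len] -> logits over the next token at each step.
        emb = self.embedding(token_ids)
        out, hidden = self.gru(emb, hidden)
        return self.proj(out), hidden


# Toy usage with random weights standing in for the real GloVe vectors.
weights = torch.randn(10_000, 100)
model = GRULanguageModel(weights)
logits, _ = model(torch.randint(0, 10_000, (8, 35)))  # logits: [8, 35, 10000]
```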


Labeling Subgraph Embeddings and Cordiality of Graphs

Let $G$ be a graph with vertex set $V(G)$ and edge set $E(G)$. A vertex labeling $f : V(G)\rightarrow \mathbb{Z}_2$ induces an edge labeling $f^{+} : E(G)\rightarrow \mathbb{Z}_2$ defined by $f^{+}(xy) = f(x) + f(y)$, for each edge $xy\in E(G)$. For each $i \in \mathbb{Z}_2$, let $v_{f}(i)=|\{u \in V(G) : f(u) = i\}|$ and $e_{f^+}(i)=|\{xy\in E(G) : f^{+}(xy) = i\}|$. A vertex labeling $f$ of a graph $G...
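A small sketch of the quantities defined above, in Python. The cordiality check at the end uses the standard condition $|v_f(0)-v_f(1)|\le 1$ and $|e_{f^+}(0)-e_{f^+}(1)|\le 1$, which the truncated snippet does not state explicitly, so treat it as an assumption:

```python
# Counts v_f(i) and e_{f+}(i) for a Z_2 vertex labeling f of a graph G,
# where the induced edge label is f+(xy) = f(x) + f(y) (mod 2).
from collections import Counter


def induced_edge_label(f, x, y):
    """f+(xy) = f(x) + f(y) in Z_2."""
    return (f[x] + f[y]) % 2


def label_counts(vertices, edges, f):
    v_counts = Counter(f[v] for v in vertices)
    e_counts = Counter(induced_edge_label(f, x, y) for x, y in edges)
    return v_counts, e_counts


def is_cordial_labeling(vertices, edges, f):
    # Assumed standard cordiality condition: both differences are at most 1.
    v_counts, e_counts = label_counts(vertices, edges, f)
    return (abs(v_counts[0] - v_counts[1]) <= 1
            and abs(e_counts[0] - e_counts[1]) <= 1)


# Example: the 4-cycle C4 with two adjacent vertices labeled 0 and two labeled 1.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
f = {0: 0, 1: 0, 2: 1, 3: 1}
print(label_counts(V, E, f), is_cordial_labeling(V, E, f))  # balanced -> True
```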


Playing with Embeddings: Evaluating embeddings for Robot Language Learning through MUD Games

Acquiring language provides a ubiquitous mode of communication across humans and robots. To this effect, distributional representations of words based on co-occurrence statistics have provided significant advancements ranging from machine translation to comprehension. In this paper, we study the suitability of using general purpose word-embeddings for language learning in robots. We propose ...
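As a toy illustration of distributional representations built from co-occurrence statistics (a generic sketch, not the embeddings, corpora, or games used in the cited work), the rows of a window-based co-occurrence matrix can serve as word vectors compared by cosine similarity:

```python
# Build window-based co-occurrence counts over a tiny toy corpus and use
# the rows as distributional word vectors.
import numpy as np

corpus = [
    "go north to the dark room".split(),
    "go south to the bright hall".split(),
    "take the lamp in the dark room".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))

window = 2
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1


def similarity(a, b):
    """Cosine similarity between the co-occurrence rows of two words."""
    u, v = counts[idx[a]], counts[idx[b]]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))


# Words appearing in similar contexts score higher than unrelated ones.
print(similarity("north", "south"), similarity("north", "lamp"))
```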


Playing with Word Meanings

With the world-wide expansion of the social web, subjectivity analysis has lately become one of the main research foci in the area of intelligent information retrieval. Being able to find out what people feel about a specific topic, be it a marketed product, a public person or a political issue, represents a very interesting application for a large class of actors, from the everyday product and se...


Sub-Word Similarity based Search for Embeddings: Inducing Rare-Word Embeddings for Word Similarity Tasks and Language Modelling

Training good word embeddings requires large amounts of data. Out-of-vocabulary words will still be encountered at test-time, leaving these words without embeddings. To overcome this lack of embeddings for rare words, existing methods leverage morphological features to generate embeddings. While the existing methods use computationally-intensive rule-based (Soricut and Och, 2015) or tool-based ...
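A simplified sketch of the general idea, in Python: induce an embedding for a rare, out-of-vocabulary word from the vectors of known words that share character n-grams with it. The Jaccard weighting and helper names are illustrative choices, not the cited paper's method.

```python
# Induce an embedding for an OOV word by averaging the vectors of known words
# that overlap with it in character n-grams, weighted by Jaccard overlap.
import numpy as np


def char_ngrams(word, n=3):
    """Character n-grams of a word padded with boundary markers."""
    padded = f"<{word}>"
    return {padded[i:i + n] for i in range(len(padded) - n + 1)}


def induce_embedding(oov_word, embeddings, n=3, top_k=10):
    """embeddings: dict word -> np.ndarray for in-vocabulary words."""
    target = char_ngrams(oov_word, n)
    scored = []
    for word, vec in embeddings.items():
        grams = char_ngrams(word, n)
        overlap = len(target & grams) / len(target | grams)  # Jaccard overlap
        if overlap > 0:
            scored.append((overlap, vec))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    top = scored[:top_k]
    if not top:
        return None  # no sub-word evidence available
    weights = np.array([s for s, _ in top])
    vectors = np.stack([v for _, v in top])
    return (weights[:, None] * vectors).sum(axis=0) / weights.sum()


# Toy usage: "unhappily" is OOV but shares n-grams with known words.
emb = {w: np.random.randn(50) for w in ["unhappy", "happily", "happy", "sadly"]}
print(induce_embedding("unhappily", emb).shape)  # (50,)
```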



Journal

Journal title: Journal of Artificial Intelligence Research

Year: 2021

ISSN: 1076-9757, 1943-5037

DOI: https://doi.org/10.1613/jair.1.12665